Search results for "statistical [Methods]"
Showing 10 of 1,664 documents
Approximation of functions over manifolds: A Moving Least-Squares approach
2021
We present an algorithm for approximating a function defined over a $d$-dimensional manifold utilizing only noisy function values at locations sampled from the manifold with noise. To produce the approximation we do not require any knowledge regarding the manifold other than its dimension $d$. We use the Manifold Moving Least-Squares approach of Sober and Levin (2016) to reconstruct the atlas of charts, and the approximation is built on top of those charts. The resulting approximant is shown to be a function defined over a neighborhood of a manifold, approximating the originally sampled manifold. In other words, given a new point located near the manifold, the approximation can be evaluated…
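The abstract does not include the construction itself; as a hedged illustration, the sketch below shows the classical one-dimensional moving least-squares idea that the manifold variant builds on: at each query point, fit a low-degree polynomial by distance-weighted least squares. The Gaussian weight, bandwidth h, and test function are assumptions for the demo, not details from the paper.

```python
import numpy as np

def mls_fit(x_query, x_samples, y_samples, h=0.1, degree=1):
    """Moving least-squares in 1-D: fit a local polynomial around the
    query point by weighted least squares and evaluate it there."""
    # Gaussian weights decaying with distance from the query point
    w = np.exp(-((x_samples - x_query) ** 2) / h ** 2)
    # Local polynomial basis centered at the query point
    A = np.vander(x_samples - x_query, degree + 1, increasing=True)
    W = np.diag(w)
    coef = np.linalg.solve(A.T @ W @ A, A.T @ W @ y_samples)
    return coef[0]  # constant term = fitted value at the query point

# Noisy samples of a smooth test function
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 200))
y = np.sin(2 * np.pi * x) + 0.05 * rng.normal(size=200)
queries = np.linspace(0.0, 1.0, 11)
print(np.array([mls_fit(q, x, y) for q in queries]).round(3))
```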
A Wideband MIMO Channel Model Derived From the Geometric Elliptical Scattering Model
2006
In this paper, we present a reference model for a wideband multiple-input multiple-output (MIMO) channel based on the geometric elliptical scattering model. The model takes into account the exact relationship between the angle of departure (AOD) and the angle of arrival (AOA). Based on this relationship, the statistical properties of the reference model are studied. Analytical solutions are presented for the three-dimensional (3D) space-time cross-correlation function (CCF), the temporal autocorrelation function (ACF), the 2D space CCF, and finally the frequency correlation function (FCF). The correlation properties are studied and visualized under the assumption of isotropic as well as no…
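Not the paper's elliptical model, but for orientation: under the classical 2-D isotropic-scattering assumption (Clarke's model), the temporal ACF reduces to a zeroth-order Bessel function of the maximum Doppler shift. A minimal sketch, with the 91 Hz Doppler value chosen arbitrarily for the demo:

```python
import numpy as np
from scipy.special import j0

def temporal_acf(tau, f_dmax):
    """Temporal ACF under 2-D isotropic scattering (Clarke's model):
    r(tau) = J0(2*pi*f_dmax*tau)."""
    return j0(2 * np.pi * f_dmax * tau)

tau = np.linspace(0.0, 0.01, 5)        # time lags in seconds
print(temporal_acf(tau, f_dmax=91.0))  # max Doppler shift of 91 Hz (arbitrary)
```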
Thermodynamics of the Classical Planar Ferromagnet Close to the Zero-Temperature Critical Point: A Many-Body Approach
2012
We explore the low-temperature thermodynamic properties and crossovers of a $d$-dimensional classical planar Heisenberg ferromagnet in a longitudinal magnetic field close to its field-induced zero-temperature critical point by employing the two-time Green's function formalism in classical statistical mechanics. By means of a classical Callen-like method for the magnetization and the Tyablikov-like decoupling procedure, we obtain, for any $d$, a low-temperature critical scenario which is quite similar to the one found for the quantum counterpart. Remarkably, for $d > 2$ the discrimination between the two cases is found to be related to the different values of the shift exponent which governs the beha…
Estimation of confidence limits for descriptive indexes derived from autoregressive analysis of time series: Methods and application to heart rate va…
2017
The growing interest in personalized medicine requires making inferences from descriptive indexes estimated from individual recordings of physiological signals, with statistical analyses focused on individual differences between/within subjects, rather than comparing supposedly homogeneous cohorts. To this end, methods to compute confidence limits of individual estimates of descriptive indexes are needed. This study introduces numerical methods to compute such confidence limits and perform statistical comparisons between indexes derived from autoregressive (AR) modeling of individual time series. Analytical approaches are generally not viable, because the indexes are usually nonlinear funct…
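The abstract does not spell out the numerical procedure; one common way to obtain such confidence limits, sketched below under assumptions of my own (a least-squares AR fit, a residual bootstrap, and a band-power index with hypothetical band edges), is to resample the model residuals, regenerate surrogate series, and read percentile limits off the re-estimated index:

```python
import numpy as np

def fit_ar(x, p):
    """Least-squares AR(p) fit; returns coefficients and residuals."""
    X = np.column_stack([x[p - k - 1:len(x) - k - 1] for k in range(p)])
    y = x[p:]
    a, *_ = np.linalg.lstsq(X, y, rcond=None)
    return a, y - X @ a

def band_power_index(a, sigma2, fs=1.0, band=(0.04, 0.15)):
    """Nonlinear descriptive index: fraction of AR spectral power in a band
    (band edges are illustrative, loosely mimicking an HRV LF band)."""
    f = np.linspace(0.0, fs / 2, 512)
    k = np.arange(1, len(a) + 1)
    psd = sigma2 / np.abs(1 - np.exp(-2j * np.pi * np.outer(f / fs, k)) @ a) ** 2
    in_band = (f >= band[0]) & (f <= band[1])
    return psd[in_band].sum() / psd.sum()

# Synthetic stand-in for an individual time series with AR structure
rng = np.random.default_rng(1)
x = rng.normal(size=300)
for t in range(2, 300):
    x[t] += 0.5 * x[t - 1] - 0.3 * x[t - 2]

p = 4
a, resid = fit_ar(x, p)
point = band_power_index(a, resid.var())

# Residual bootstrap: resample residuals, regenerate a surrogate series,
# refit the AR model, and recompute the index
boot = []
for _ in range(200):
    e = rng.choice(resid, size=len(x))
    xs = np.zeros(len(x))
    for t in range(p, len(x)):
        xs[t] = a @ xs[t - p:t][::-1] + e[t]
    ab, rb = fit_ar(xs, p)
    boot.append(band_power_index(ab, rb.var()))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"index = {point:.3f}, 95% confidence limits = ({lo:.3f}, {hi:.3f})")
```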
An Estimative Model of Automated Valuation Method in Italy
2017
The Automated Valuation Method (AVM) is a computer software program that analyzes data using an automated process. It is related to the process of appraising a universe of real estate properties, using common data and standard appraisal methodologies. Generally, the AVM is based on quantitative models (statistical, mathematical, econometric, etc.) for the valuation of properties gathered into homogeneous groups (by use and location), for which samples of market data are collected. The real estate data are collected regularly and systematically. Within the AVM, the proposed valuation scheme is a uniequational model to value properties in terms of widespread availability of sample …
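As a hedged toy example of a uniequational model (not the paper's specification; the characteristics and figures below are invented), a single linear equation can be fitted to a homogeneous group's market sample and then evaluated on the subject property:

```python
import numpy as np

# Hypothetical market sample for one homogeneous group:
# columns = floor area (m^2), number of rooms, floor level
X = np.array([[80, 3, 2], [95, 4, 1], [60, 2, 5],
              [120, 5, 3], [75, 3, 4]], dtype=float)
prices = np.array([160_000, 185_000, 130_000, 240_000, 155_000], dtype=float)

# Uniequational additive model: price = b0 + b1*area + b2*rooms + b3*floor
A = np.column_stack([np.ones(len(X)), X])
beta, *_ = np.linalg.lstsq(A, prices, rcond=None)

subject = np.array([1, 85, 3, 3])  # intercept term plus the subject's traits
print(f"estimated value: {subject @ beta:,.0f}")
```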
Hierarchical modeling for rare event detection and cell subset alignment across flow cytometry samples.
2013
Flow cytometry is the prototypical assay for multi-parameter single-cell analysis, and is essential in vaccine and biomarker research for the enumeration of antigen-specific lymphocytes that are often found in extremely low frequencies (0.1% or less). Standard analysis of flow cytometry data relies on visual identification of cell subsets by experts, a process that is subjective and often difficult to reproduce. An alternative and more objective approach is the use of statistical models to identify cell subsets of interest in an automated fashion. Two specific challenges for automated analysis are to detect extremely low-frequency event subsets without biasing the estimate by pre-processing…
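The paper's hierarchical Bayesian machinery is not reproduced here; as a minimal stand-in for statistical, automated subset identification, the sketch below fits a plain Gaussian mixture with scikit-learn to synthetic two-marker data containing a deliberately rare (about 0.5%) population:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(2)
# Synthetic two-marker data: an abundant population plus a rare one
bulk = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(10_000, 2))
rare = rng.normal(loc=[3.0, 3.0], scale=0.2, size=(50, 2))
X = np.vstack([bulk, rare])

# Two-component Gaussian mixture as an automated "gating" surrogate
gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
labels = gmm.predict(X)
print("events per component:", np.bincount(labels))
print("estimated weights   :", gmm.weights_.round(4))
```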
Geometrical Modeling of Non-Stationary Polarimetric Vehicular Radio Channels
2019
This paper presents a geometry-based statistical model (GBSM) of polarimetric wideband multipath radio channels for vehicle-to-vehicle (V2V) communications. The proposed model captures the effects of depolarization caused by multipath propagation, and it also accounts for the non-stationary characteristics of wideband V2V channels. This is a novel feature, because the existing polarimetric channel models are built on the assumption that the channel is a wide-sense stationary random process. In the modeling framework described in this paper, the channel depolarization function is given by a linear transformation in the form of a simple rotation matrix. This linear transformation is transpare…
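A minimal sketch of the depolarization step as described, a linear transformation in the form of a simple rotation matrix acting on a two-component polarization vector; the rotation angle here is arbitrary, not a model parameter from the paper:

```python
import numpy as np

def depolarize(E, theta):
    """Apply a rotation-matrix depolarization to a polarization vector
    E = [E_vertical, E_horizontal]."""
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    return R @ E

E_tx = np.array([1.0, 0.0])              # purely vertical polarization
print(depolarize(E_tx, np.deg2rad(20)))  # part of the power leaks to H
```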
Methodological considerations for interrupted time series analysis in radiation epidemiology: an overview
2021
Interrupted time series analysis (ITSA) is a method that can be applied to evaluate health outcomes in populations exposed to ionizing radiation following major radiological events. Using aggregated time series data, ITSA evaluates whether the time trend of a health indicator shows a change associated with the radiological event. That is, ITSA checks whether there is a statistically significant discrepancy between the projection of a pre-event trend and the data empirically observed after the event. Conducting ITSA requires one to consider specific methodological issues due to unique threats to internal validity that make ITSA prone to bias. We here discuss the strengths and limitations of …
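For concreteness, the usual ITSA workhorse is segmented regression with post-event level- and slope-change terms. The sketch below uses invented monthly counts and plain OLS; a real analysis would also address autocorrelated errors, one of the validity threats the overview discusses:

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(48, dtype=float)     # 48 monthly observations (hypothetical)
event = (t >= 24).astype(float)    # indicator: after the event at month 24
t_after = np.where(t >= 24, t - 24, 0.0)

# Synthetic outcome: pre-event trend plus a post-event level and slope change
y = 50 + 0.3 * t + 4.0 * event + 0.5 * t_after + rng.normal(0, 1.5, len(t))

# Segmented regression: y = b0 + b1*t + b2*event + b3*t_after
A = np.column_stack([np.ones_like(t), t, event, t_after])
b, *_ = np.linalg.lstsq(A, y, rcond=None)
print(f"level change = {b[2]:.2f}, slope change = {b[3]:.2f}")
```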
Entropy-Based Classifier Enhancement to Handle Imbalanced Class Problem
2017
The paper presents a possible enhancement of entropy-based classifiers to handle problems caused by class imbalance in the original dataset. The proposed method was tested on synthetic data in order to analyse its robustness in a controlled environment with different class proportions. The proposed method was also tested on real medical data with imbalanced classes and compared to the results of the original classification algorithm. The medical field was chosen for testing due to frequent situations with uneven class ratios.
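The abstract does not detail the enhancement itself; as background only, the sketch below computes the Shannon entropy of an imbalanced class distribution and the common inverse-frequency class weights often used to counteract imbalance. This is a generic heuristic, not the paper's method:

```python
import numpy as np

def class_entropy(labels):
    """Shannon entropy (bits) of the empirical class distribution."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

def inverse_frequency_weights(labels):
    """Common rebalancing heuristic: weight class c by n / (k * n_c)."""
    classes, counts = np.unique(labels, return_counts=True)
    return dict(zip(classes, len(labels) / (len(classes) * counts)))

y = np.array([0] * 950 + [1] * 50)   # 95/5 imbalance
print(f"class entropy = {class_entropy(y):.3f} bits")
print(inverse_frequency_weights(y))
```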
Are nonlinear model-free conditional entropy approaches for the assessment of cardiac control complexity superior to the linear model-based one?
2016
Objective: We test the hypothesis that the linear model-based (MB) approach for the estimation of conditional entropy (CE) can be utilized to assess the complexity of the cardiac control in healthy individuals. Methods: An MB estimate of CE was tested in an experimental protocol (i.e., the graded head-up tilt) known to produce a gradual decrease of cardiac control complexity as a result of the progressive vagal withdrawal and concomitant sympathetic activation. The MB approach was compared with traditionally exploited nonlinear model-free (MF) techniques such as corrected approximate entropy, sample entropy, corrected CE, two $k$-nearest-neighbor CE procedures and permutation CE. Electroca…
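Of the model-free estimators compared, sample entropy has a compact standard definition, SampEn(m, r) = -ln(A/B), where B counts template matches of length m and A of length m+1. A reference-style sketch follows; the parameter choices m = 2 and r = 0.2 are conventional defaults, not the paper's settings:

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy SampEn(m, r) = -ln(A/B), with Chebyshev distance,
    tolerance r times the series SD, and self-matches excluded."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()

    def count_matches(mm):
        # All overlapping templates of length mm
        templ = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        c = 0
        for i in range(len(templ) - 1):
            d = np.max(np.abs(templ[i + 1:] - templ[i]), axis=1)
            c += np.count_nonzero(d <= tol)
        return c

    B, A = count_matches(m), count_matches(m + 1)
    return -np.log(A / B)

rng = np.random.default_rng(4)
print(f"white noise: {sample_entropy(rng.normal(size=1000)):.2f}")   # high
print(f"sine wave  : {sample_entropy(np.sin(np.linspace(0, 60, 1000))):.2f}")  # low
```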